
    Is my configuration any good: checking usability in an interactive sensor-based activity monitor

    We investigate formal analysis of two aspects of usability in a deployed interactive, configurable and context-aware system: an event-driven, sensor-based homecare activity monitor system. The system was not designed from formal requirements or specification: we model the system as it is, in the context of an agile development process. Our aim was to determine whether formal modelling and analysis can contribute to improving usability, and if so, which style of modelling is most suitable. The purpose of the analysis is to inform configurers about how to interact with the system, so that the system is more usable for participants, and to guide future developments. We consider redundancies in configuration rules defined by carers and participants, and the interaction modality of the output messages. Two approaches to modelling are considered: a deep embedding, in which devices, sensors and rules are represented explicitly by data structures in the modelling language and non-determinism is employed to model all possible device and sensor states, and a shallow embedding, in which the rules and device and sensor states are represented directly in propositional logic. The former requires a conventional machine and a model-checker for analysis, whereas the latter is implemented using a SAT solver directly on the activity monitor hardware. We draw conclusions about the role of formal models and reasoning in deployed systems and the need for clear semantics and ontologies for interaction modalities.
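    The shallow-embedding idea can be illustrated with a minimal sketch: configuration rules become propositional clauses over sensor and device variables, and a rule is redundant when every assignment satisfying the other rules already satisfies it (equivalently, "others AND NOT rule" is unsatisfiable, which is the query a SAT solver would answer). The variable meanings and rules below are invented for illustration, and exhaustive enumeration stands in for a real SAT solver.

    ```python
    from itertools import product

    # A clause is a set of literals: positive int = variable true, negative = false.
    # Hypothetical homecare variables: 1 = door_open, 2 = night, 3 = alert_carer.
    rules = [
        {-1, 3},        # door_open -> alert_carer
        {-1, -2, 3},    # door_open AND night -> alert_carer (weaker than rule 0)
    ]

    def satisfies(assignment, clause):
        """True if the assignment (dict var -> bool) satisfies the clause."""
        return any(assignment[abs(lit)] == (lit > 0) for lit in clause)

    def redundant(clause, others, variables):
        """A clause is redundant if every model of the other clauses satisfies
        it, i.e. others AND NOT clause is unsatisfiable (the SAT query)."""
        for values in product([False, True], repeat=len(variables)):
            assignment = dict(zip(variables, values))
            if all(satisfies(assignment, c) for c in others) and not satisfies(assignment, clause):
                return False
        return True

    variables = sorted({abs(lit) for c in rules for lit in c})
    for i, rule in enumerate(rules):
        others = rules[:i] + rules[i + 1:]
        print(f"rule {i} redundant: {redundant(rule, others, variables)}")
    ```

    Here the second rule is reported redundant because the first already forces an alert whenever the door opens, night or not.
    
    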

    A specialised constraint approach for stable matching problems

    Constraint programming is a generalised framework designed to solve combinatorial problems. The framework comprises a set of predefined independent components and generalised algorithms, a versatile structure which allows a variety of rich combinatorial problems to be represented and solved relatively easily. Stable matching problems consist of a set of participants wishing to be matched into pairs or groups in a stable manner. A matching is said to be stable if there is no pair or group of participants that would rather make a private arrangement to improve their situation and thus undermine the matching. There are many important real-world applications of stable matching problems. These include the Hospitals/Residents problem, in which a set of graduating medical students, known as residents, must be assigned to hospital posts. Some authorities assign children to schools as a stable matching problem, and many other problems are tackled in the same way. A number of classical stable matching problems have efficient specialised algorithmic solutions. Constraint programming solutions to stable matching problems have been investigated in the past, and have been able to match the theoretically optimal time complexities of the algorithmic solutions. However, empirical evidence has shown that in practice these constraint solutions run significantly slower than the specialised algorithmic solutions, and their memory requirements prohibit them from solving problems which the specialised algorithmic solutions can solve in a fraction of a second. My contribution investigates the possibility of modelling stable matching problems as specialised constraints.
    The motivation behind this approach was to find solutions to these problems which maintain the versatility of the constraint solutions, whilst significantly reducing the performance gap between constraint and specialised algorithmic solutions. To this end, specialised constraint solutions have been developed for the stable marriage problem and the Hospitals/Residents problem. Empirical evidence is presented which shows that these solutions can solve significantly larger problems than previously published constraint solutions. For these larger problem instances the specialised constraint solutions came within a factor of four of the time required by algorithmic solutions. It is also shown that, through further specialisation, these constraint solutions can be made to run significantly faster, although these improvements came at the cost of versatility. As a demonstration of the versatility of these solutions it is shown that, by adding simple side constraints, richer problems can be easily modelled. These richer problems add additional criteria and/or an optimisation requirement to the original stable matching problems. Many of these problems have been proven to be NP-hard and some have no known algorithmic solutions. Included with these models are results from empirical studies which show that these are indeed feasible solutions to the richer problems. Results from the studies also provide some insight into the structure of these problems, some of which have had little or no previous study.
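    The specialised algorithmic solutions the abstract benchmarks against are typified by the textbook Gale–Shapley (deferred acceptance) algorithm for the stable marriage problem. The sketch below is that classic algorithm, not the thesis's specialised constraint; the preference lists are invented for illustration.

    ```python
    def gale_shapley(men_prefs, women_prefs):
        """Textbook man-optimal Gale-Shapley: returns a stable matching
        as a dict woman -> man."""
        # rank[w][m] = position of m in w's preference list (lower = preferred)
        rank = {w: {m: i for i, m in enumerate(p)} for w, p in women_prefs.items()}
        free = list(men_prefs)                   # men with no partner yet
        next_choice = {m: 0 for m in men_prefs}  # index of next woman to propose to
        match = {}                               # woman -> man
        while free:
            m = free.pop()
            w = men_prefs[m][next_choice[m]]
            next_choice[m] += 1
            if w not in match:
                match[w] = m
            elif rank[w][m] < rank[w][match[w]]:  # w prefers m to current partner
                free.append(match[w])
                match[w] = m
            else:
                free.append(m)                    # w rejects m; he proposes again
        return match

    # Invented 3x3 instance for illustration.
    men = {"a": ["x", "y", "z"], "b": ["y", "x", "z"], "c": ["x", "z", "y"]}
    women = {"x": ["b", "a", "c"], "y": ["a", "b", "c"], "z": ["a", "b", "c"]}
    print(gale_shapley(men, women))
    ```

    The algorithm runs in O(n²) time on complete lists; the thesis's point is that constraint models can approach this efficiency while staying easy to extend with side constraints.
    
    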

    A Constraint Programming Approach to the Hospitals / Residents Problem

    An instance I of the Hospitals / Residents problem (HR) involves a set of residents (graduating medical students) and a set of hospitals, where each hospital has a given capacity. The residents have preferences for the hospitals, as do hospitals for residents. A solution of I is a stable matching, which is an assignment of residents to hospitals that respects the capacity conditions and preference lists in a precise way. In this paper we present constraint encodings for HR that give rise to important structural properties. We also present a computational study using both randomly-generated and real-world instances. Our study suggests that Constraint Programming is indeed an applicable technology for solving this problem, in terms of both theory and practice.
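    For readers unfamiliar with HR, the classic resident-oriented deferred-acceptance algorithm (the capacity-aware extension of Gale–Shapley) shows what a stable matching looks like concretely. This is the standard algorithm, not the paper's constraint encoding, and the instance is invented for illustration.

    ```python
    def hospitals_residents(res_prefs, hosp_prefs, capacity):
        """Resident-oriented deferred acceptance for Hospitals/Residents:
        each free resident proposes down their list; a hospital over
        capacity rejects its least-preferred current assignee."""
        rank = {h: {r: i for i, r in enumerate(p)} for h, p in hosp_prefs.items()}
        assigned = {h: [] for h in hosp_prefs}    # hospital -> residents
        nxt = {r: 0 for r in res_prefs}           # next hospital to try
        free = list(res_prefs)
        while free:
            r = free.pop()
            if nxt[r] >= len(res_prefs[r]):
                continue                          # list exhausted: r unmatched
            h = res_prefs[r][nxt[r]]
            nxt[r] += 1
            assigned[h].append(r)
            if len(assigned[h]) > capacity[h]:
                worst = max(assigned[h], key=lambda x: rank[h][x])
                assigned[h].remove(worst)
                free.append(worst)
        return assigned

    # Invented instance: two hospitals, capacities 2 and 1, three residents.
    res_prefs = {"r1": ["h1", "h2"], "r2": ["h1", "h2"], "r3": ["h1", "h2"]}
    hosp_prefs = {"h1": ["r1", "r2", "r3"], "h2": ["r1", "r2", "r3"]}
    capacity = {"h1": 2, "h2": 1}
    print(hospitals_residents(res_prefs, hosp_prefs, capacity))
    ```

    h1 keeps its two most-preferred applicants and the displaced resident moves down to h2, so capacities and preference lists are respected with no blocking pair.
    
    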

    Pharmacological actions of nobiletin in the modulation of platelet function

    Background and Purpose: The discovery that flavonoids are capable of inhibiting platelet function has led to their investigation as potential antithrombotic agents. However, despite the range of studies on the antiplatelet properties of flavonoids, little is known about the mechanisms by which flavonoids inhibit platelet function. In this study, we aimed to explore the pharmacological effects of a polymethoxy flavonoid, nobiletin, in the modulation of platelet function.
    Experimental Approach: The ability of nobiletin to modulate platelet function was explored by using a range of in vitro and in vivo experimental approaches. Aggregation, dense granule secretion and spreading assays were performed using washed platelets. The fibrinogen binding, α-granule secretion and calcium mobilisation assays were performed using platelet-rich plasma, and whole blood was used in impedance aggregometry and thrombus formation experiments. The effect of nobiletin in vivo was assessed by measuring tail bleeding time using C57BL/6 mice.
    Key Results: Nobiletin was shown to suppress a range of well-established activatory mechanisms, including platelet aggregation, granule secretion, integrin modulation, calcium mobilisation and thrombus formation. Nobiletin was shown to extend bleeding time in mice and reduce the phosphorylation of Akt and PLCγ2 within the collagen receptor (GPVI)-stimulated pathway, in addition to increasing the levels of cGMP and phosphorylation of VASP, a protein whose activity is associated with inhibitory cyclic nucleotide signalling.
    Conclusions and Implications: This study provides insight into the underlying molecular mechanisms through which nobiletin modulates haemostasis and thrombus formation. Therefore, nobiletin may represent a potential antithrombotic agent of dietary origins.

    Towards the Verification of Pervasive Systems

    Pervasive systems, that is, roughly speaking, systems that can interact with their environment, are increasingly common. In such systems there are many dimensions to assess: security and reliability, safety and liveness, real-time response, etc. So far, modelling and formalisation attempts have been piecemeal. This paper describes our analysis of a pervasive case study (MATCH, a homecare application) and our proposal for formal (particularly verification) approaches. Our goal is to see to what extent current state-of-the-art formal methods are capable of coping with the verification demand introduced by pervasive systems, and to point out their limitations.

    Farnesoid X receptor and liver X receptor ligands initiate formation of coated platelets

    The liver X receptors (LXRs) and farnesoid X receptor (FXR) have been identified in human platelets. Ligands of these receptors have been shown to have nongenomic inhibitory effects on platelet activation by platelet agonists. This, however, seems to contradict the platelet hyper-reactivity associated with several pathological conditions in which circulating levels of LXR and FXR ligands are increased, such as hyperlipidemia, type 2 diabetes mellitus, and obesity. We, therefore, investigated whether ligands for the LXR and FXR receptors were capable of priming platelets to the activated state without stimulation by platelet agonists. Treatment of platelets with ligands for LXR and FXR converted platelets to the procoagulant state, with increases in phosphatidylserine exposure, platelet swelling, reduced membrane integrity, depolarization of the mitochondrial membrane, and microparticle release observed. Additionally, platelets displayed features associated with coated platelets, such as P-selectin exposure, fibrinogen binding, fibrin generation supported by increased serine protease activity, and inhibition of integrin αIIbβ3. LXR and FXR ligand-induced formation of coated platelets was found to be dependent on both reactive oxygen species and intracellular calcium mobilization, and, for FXR ligands, on cyclophilin D. We conclude that treatment with LXR and FXR ligands initiates coated platelet formation, which is thought to support coagulation but results in desensitization to platelet stimuli through inhibition of αIIbβ3, consistent with their ability to inhibit platelet function and stable thrombus formation in vivo.

    Multi-parameter phenotyping of platelet reactivity for stratification of human cohorts

    Accurate and comprehensive assessment of platelet function across cohorts of donors may be key to understanding the risk of thrombotic events associated with cardiovascular disease, and hence help personalise the application of antiplatelet drugs. However, platelet function tests can be difficult to perform and analyse, unreliable or uninformative, and poorly standardised across studies. The Platelet Phenomic Analysis (PPAnalysis) assay and associated open-source software platform were developed in response to these challenges. PPAnalysis utilises pre-prepared freeze-dried microtitre plates to provide a detailed characterisation of platelet function. The automated analysis of the high-dimensional data enables the identification of sub-populations of donors with distinct platelet function phenotypes. Using this approach we identified that the Sensitivity of a donor's platelets to an agonist and their Capacity to generate a functional response are distinct, independent metrics of platelet reactivity. Hierarchical clustering of these metrics identified six subgroups with distinct platelet phenotypes within healthy cohorts, indicating that platelet reactivity does not fit into the traditional simple categories of 'high' and 'low' responders. These platelet phenotypes were found to exist in two independent cohorts of healthy donors and were stable on recall. PPAnalysis is a powerful tool for stratification of cohorts on the basis of platelet reactivity, which will enable investigation of the causes and consequences of differences in platelet function and drive progress towards precision medicine.
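    The stratification step described above, hierarchical clustering of donors on two reactivity metrics, can be sketched in miniature. The sketch below is a generic complete-linkage agglomerative clustering on invented (Sensitivity, Capacity) scores, not the PPAnalysis software or its real data.

    ```python
    from math import dist

    def agglomerative(points, n_clusters):
        """Minimal complete-linkage agglomerative clustering: repeatedly merge
        the two clusters whose farthest members are closest together."""
        clusters = [[p] for p in points]
        while len(clusters) > n_clusters:
            best = None
            for i in range(len(clusters)):
                for j in range(i + 1, len(clusters)):
                    # complete linkage: distance between farthest members
                    d = max(dist(a, b) for a in clusters[i] for b in clusters[j])
                    if best is None or d < best[0]:
                        best = (d, i, j)
            _, i, j = best
            clusters[i] += clusters.pop(j)   # merge the closest pair
        return clusters

    # Invented (sensitivity, capacity) scores for six donors: two clear groups.
    donors = [(0.1, 0.2), (0.15, 0.25), (0.12, 0.18),
              (0.8, 0.9), (0.85, 0.8), (0.9, 0.95)]
    groups = agglomerative(donors, 2)
    print(groups)
    ```

    Cutting the hierarchy at a chosen number of clusters is what yields the donor subgroups; the study's six phenotypes come from the same idea applied to real multi-agonist data.
    
    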

    Factors affecting the microwave coking of coals and the implications on microwave cavity design

    The work carried out in this paper assessed how processing conditions and feedstock affect the quality of the coke produced during microwave coke making. The aim was to gather information that would support the development of an optimised microwave coke making oven. Experiments were carried out in a non-optimised 2450 MHz cylindrical cavity. The effect of treatment time (15–120 min), power input (750 W–4.5 kW) and overall energy input (1700–27,200 kWh/t) on a range of coals (semi-bituminous–anthracite) was investigated. Intrinsic reactivity, random reflectance, strength index and dielectric properties of the produced cokes were compared with those of two commercial cokes to assess the degree of coking produced in the microwave system. Overall energy input and coal rank were found to be the major factors determining the degree of coking following microwave treatment. The dependency on coal rank was attributed to the larger amount of volatiles that had to be removed from the lower-ranked coals, and the increasing dielectric loss of the organic component of the coal with rank due to increased structural ordering. Longer treatment times at lower powers or shorter treatment times at higher powers are expected to produce the same degree of coking. It was concluded that microwave coke making represents a potential step-change in the coking industry by reducing treatment times by an order of magnitude, introducing flexibility and potentially decreasing sensitivity to feedstock quality requirements. The main challenges to development are the energy requirements (which will need to be significantly reduced in an optimised process) and penetration depth (which will require an innovative reactor design to maximise the advantage of using microwaves). Understanding and quantifying the rapidly changing dielectric properties of the coal and coke materials is vital in addressing both of these challenges.
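    The penetration-depth challenge named above can be made concrete with the standard expression for the power penetration depth of a dielectric, the depth at which absorbed microwave power falls to 1/e of its surface value. The sketch below evaluates that textbook formula at the 2450 MHz cavity frequency; the permittivity values are invented for illustration, not measured coal properties from the paper.

    ```python
    from math import pi, sqrt

    C = 299_792_458.0  # speed of light in vacuum, m/s

    def penetration_depth(freq_hz, eps_real, eps_imag):
        """Power penetration depth D_p (m) of a dielectric:
        D_p = (lambda0 / (2*pi*sqrt(2*eps'))) / sqrt(sqrt(1 + (eps''/eps')^2) - 1)
        where lambda0 is the free-space wavelength and eps', eps'' are the
        real and imaginary parts of the relative permittivity."""
        lam0 = C / freq_hz
        loss_ratio = eps_imag / eps_real
        return lam0 / (2 * pi * sqrt(2 * eps_real)) / sqrt(sqrt(1 + loss_ratio**2) - 1)

    # Invented illustrative permittivity (eps' = 5, eps'' = 0.5) at 2450 MHz.
    dp = penetration_depth(2.45e9, eps_real=5.0, eps_imag=0.5)
    print(f"penetration depth = {dp * 100:.1f} cm")
    ```

    A few centimetres of penetration at 2450 MHz is why a scaled-up oven needs reactor geometry designed around the heated charge rather than a simple large cavity, and why the changing dielectric properties during coking matter: as eps'' rises with structural ordering, the penetration depth shrinks further.
    
    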